feat: support ZAI token metadata, trigger compaction on idle sessions, add GLM system prompt #4710
Closed
processtrader wants to merge 3 commits into anomalyco:dev from
Conversation
IgorWarzocha added a commit to IgorWarzocha/opencode that referenced this pull request on Dec 31, 2025:
Implements comprehensive GLM (Zhipu AI) model integration.

Core Features:
- Add GLM model detection and routing in session system
- Extract token counts from ZAI's Anthropic-compatible metadata structure
- Add ZAI provider to auth menu (zai-coding-plan, GLM-4.7)
- Create comprehensive GLM system prompt with strict engineering constraints

Technical Changes:
- packages/opencode/src/session/index.ts: Extract tokens from metadata when top-level usage is 0
- packages/opencode/src/session/system.ts: Route GLM models to specialized system prompt
- packages/opencode/src/session/prompt/glm.txt: New rigorous system prompt with XML-structured constraints
- packages/opencode/src/provider/provider.ts: Add zai-coding-plan loader
- packages/opencode/src/provider/models.ts: Inject default ZAI provider definition
- packages/opencode/src/cli/cmd/auth.ts: Add ZAI to provider priority list (1.5)

Refs: Adapted from upstream PR anomalyco#4710 with enhanced system prompt engineering
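The commit message above routes GLM models to a dedicated system prompt in session/system.ts. A hedged sketch of that routing idea; the prompt constants, function names, and detection regex are illustrative assumptions, not the actual opencode code:

```ts
// Placeholder prompt contents; the real files are prompt/glm.txt and prompt/anthropic.txt.
const PROMPT_ANTHROPIC = "...contents of prompt/anthropic.txt..."
const PROMPT_GLM = "...contents of prompt/glm.txt..."

// Match GLM-family model IDs such as "glm-4.6" or "GLM-4.7" (case-insensitive).
function isGlmModel(modelID: string): boolean {
  return /(^|\/)glm-/i.test(modelID)
}

// Pick the specialized GLM prompt when a GLM model is detected,
// otherwise fall back to the default Anthropic-style prompt.
export function systemPromptFor(modelID: string): string {
  return isGlmModel(modelID) ? PROMPT_GLM : PROMPT_ANTHROPIC
}
```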
Contributor
Closing this pull request because it has had no updates for more than 60 days. If you plan to continue working on it, feel free to reopen or open a new PR.
Summary
Fix token extraction for ZAI/Anthropic API, improve auto-compaction timing for idle sessions, and add dedicated system prompt for GLM-4.6 model.
Changes
Token Extraction Fix (`session/index.ts`)
- ZAI's Anthropic-compatible API reports token counts under `metadata.anthropic.usage` instead of the top-level `usage` object, causing the sidebar to show 0 tokens.
- Fall back to `metadata.anthropic.usage` when the top-level values are missing or zero (sketched below).
- Handle both camelCase (`inputTokens`) and snake_case (`input_tokens`) field names.
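A minimal sketch of that fallback, assuming hypothetical type and function names; only `metadata.anthropic.usage`, `inputTokens`, and `input_tokens` come from the PR description:

```ts
interface AnthropicUsage {
  inputTokens?: number
  outputTokens?: number
  input_tokens?: number
  output_tokens?: number
}

interface StreamResult {
  usage?: { inputTokens?: number; outputTokens?: number }
  metadata?: { anthropic?: { usage?: AnthropicUsage } }
}

// Prefer the top-level usage object; fall back to ZAI's
// metadata.anthropic.usage, accepting both camelCase and snake_case names.
function extractTokens(result: StreamResult) {
  const top = result.usage
  if (top && (top.inputTokens ?? 0) > 0) {
    return { input: top.inputTokens ?? 0, output: top.outputTokens ?? 0 }
  }
  const zai = result.metadata?.anthropic?.usage
  return {
    input: zai?.inputTokens ?? zai?.input_tokens ?? 0,
    output: zai?.outputTokens ?? zai?.output_tokens ?? 0,
  }
}
```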
Compaction Timing Fix (`session/prompt.ts`)
- Added a `hasPendingCompaction` guard to prevent infinite loops when a compaction task is already queued (sketched below).
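A rough sketch of the guard, assuming a hypothetical session shape and scheduling helper; only the `hasPendingCompaction` name comes from the PR description:

```ts
interface SessionState {
  hasPendingCompaction: boolean
}

// Skip scheduling if a compaction task is already queued, so an idle
// session cannot keep re-triggering compaction in a loop.
async function maybeCompact(session: SessionState, compact: () => Promise<void>) {
  if (session.hasPendingCompaction) return
  session.hasPendingCompaction = true
  try {
    await compact()
  } finally {
    session.hasPendingCompaction = false
  }
}
```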
GLM System Prompt (`session/prompt/glm.txt`)
Added a dedicated system prompt optimized for GLM-4.6's capabilities:
- Follows the structure of `anthropic.txt` for maintainability.

Testing